Continuous-Time Markov Decision Processes
Similar resources
On $L_1$-weak ergodicity of nonhomogeneous continuous-time Markov processes
In the present paper we investigate the $L_1$-weak ergodicity of nonhomogeneous continuous-time Markov processes with general state spaces. We provide a necessary and sufficient condition for such processes to satisfy $L_1$-weak ergodicity. Moreover, we apply the obtained results to establish $L_1$-weak ergodicity of quadratic stochastic processes.
Sufficiency of Markov Policies for Continuous-Time Markov Decision Processes and Solutions of Forward Kolmogorov Equation for Jump Markov Processes
In continuous-time Markov decision processes (CTMDPs) with Borel state and action spaces and unbounded transition rates, for an arbitrary policy, we construct a relaxed Markov policy such that the marginal distribution on the state-action pairs at any time instant is the same for both policies. This result implies the existence of a relaxed Markov policy that performs equally to an arbitrary po...
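For orientation only (the notation is illustrative and not taken from this paper, and it is written for a countable state space for readability; $\mu_t$ denotes the state marginal at time $t$, $q(j \mid i, a)$ the transition rates, and $\pi_t(da \mid i)$ a relaxed Markov policy), the forward Kolmogorov equation governing such marginals can be sketched as
\[
\frac{d}{dt}\,\mu_t(j) \;=\; \sum_{i} \mu_t(i) \int_{A} q(j \mid i, a)\, \pi_t(da \mid i), \qquad \mu_0 \ \text{given},
\]
with the convention $q(i \mid i, a) = -\sum_{j \neq i} q(j \mid i, a)$.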
Uniformization in Markov Decision Processes
Continuous-time Markov decision processes (CTMDP) may be viewed as a special case of semi-Markov decision processes (SMDP) where the intertransition times are exponentially distributed and the decision maker is allowed to choose actions whenever the system state changes. When the transition rates are identical for each state and action pair, one can convert a CTMDP into an equival...
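As an illustration of the uniformization idea sketched in this abstract (the toy model and variable names are my own, not the paper's), the following Python sketch converts the rates of a small CTMDP into the transition probabilities of a uniformized discrete-time MDP, using a uniformization constant no smaller than any total exit rate.

import numpy as np

# Toy CTMDP: 2 states, 2 actions. rates[s, a, s2] is the jump rate from
# state s to state s2 != s under action a (diagonal entries are unused).
rates = np.array([
    [[0.0, 1.5], [0.0, 0.5]],   # rates out of state 0 under actions 0, 1
    [[2.0, 0.0], [1.0, 0.0]],   # rates out of state 1 under actions 0, 1
])

n_states, n_actions, _ = rates.shape
exit_rate = rates.sum(axis=2)   # total exit rate lambda(s, a)
Lam = exit_rate.max()           # uniformization constant, Lam >= lambda(s, a) for all (s, a)

# Discrete-time transition probabilities of the uniformized chain:
# p(s'|s,a) = rates[s,a,s'] / Lam for s' != s, and the leftover mass
# 1 - lambda(s,a)/Lam becomes a self-loop probability p(s|s,a).
P = rates / Lam
for s in range(n_states):
    for a in range(n_actions):
        P[s, a, s] = 1.0 - exit_rate[s, a] / Lam

assert np.allclose(P.sum(axis=2), 1.0)  # each (s, a) row is a probability distribution
print(P)

For discounted or average criteria the costs and discount factor are rescaled accordingly; the sketch above only shows the transition-probability part of the conversion.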
Continuous time Markov decision processes
In this paper, we consider denumerable-state continuous-time Markov decision processes with (possibly unbounded) transition and cost rates under the average criterion. We present a set of conditions and prove the existence of both average cost optimal stationary policies and a solution of the average optimality equation under these conditions. The results in this paper are applied to an admission con...
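For orientation (the notation below is generic and not taken from this paper: $S$ is the state space, $A(i)$ the admissible actions at state $i$, $q(j \mid i, a)$ the transition rates, $c(i, a)$ the cost rates, $g$ the optimal average cost, and $h$ a relative value function), the average optimality equation mentioned here is usually written as
\[
g \;=\; \inf_{a \in A(i)} \Big\{ c(i, a) + \sum_{j \in S} q(j \mid i, a)\, h(j) \Big\}, \qquad i \in S,
\]
with the convention $q(i \mid i, a) = -\sum_{j \neq i} q(j \mid i, a)$; under suitable conditions, a stationary policy attaining the infimum at every state is average cost optimal.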
Translation Invariant Exclusion Processes (Book in Progress), © 2003
Contents (excerpt): 1 Markov chains and Markov processes: 1.1 Discrete-time Markov chains; 1.2 Continuous-time Markov chains; 1.3 General definitions for Markov processes; 1.4 Poisson processes; 1.5 H...